Convex formulations of radius-margin based Support Vector Machines

Authors

  • Huyen Do
  • Alexandros Kalousis
Abstract

We consider Support Vector Machines (SVMs) learned together with linear transformations of the feature spaces on which they are applied. Under this scenario the radius of the smallest data-enclosing sphere is no longer fixed; therefore, optimizing the SVM error bound with respect to both the radius and the margin has the potential to deliver a tighter error bound. In this paper we present two novel algorithms: R-SVMμ, an SVM radius-margin based feature selection algorithm, and R-SVM, a metric learning-based SVM. We derive our algorithms by exploiting a new, tighter approximation of the radius and a metric learning interpretation of SVM. Both directly optimize the radius-margin error bound using linear transformations. Unlike almost all existing radius-margin based SVM algorithms, which are either non-convex or combinatorial, our algorithms are standard quadratic convex optimization problems with linear or quadratic constraints. We perform a number of experiments on benchmark datasets. R-SVMμ exhibits excellent feature selection performance compared to state-of-the-art feature selection methods, such as L1-norm and elastic-net based methods. R-SVM achieves significantly better classification performance than SVM and its other state-of-the-art variants. From the results it is clear that incorporating the radius, as a means to control the data spread, into the cost function has strong beneficial effects. Proceedings of the 30th International Conference on Machine Learning, Atlanta, Georgia, USA, 2013. JMLR: W&CP volume 28.
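For context, the bound referred to above is Vapnik's radius-margin bound: for a hard-margin SVM trained on n examples, the expected leave-one-out error is controlled by the ratio of the squared radius R of the smallest data-enclosing sphere to the squared margin γ, and since γ = 1/‖w‖ this couples R with the learned weight vector:

\[
\mathbb{E}\left[\mathrm{err}_{\mathrm{LOO}}\right] \;\le\; \frac{1}{n}\,\mathbb{E}\!\left[\frac{R^2}{\gamma^2}\right] \;=\; \frac{1}{n}\,\mathbb{E}\!\left[R^2\,\lVert w\rVert^2\right].
\]

In a fixed feature space R is a constant, so standard SVMs minimize ‖w‖² alone; once a linear transformation of the features is also learned, R varies with the transformation, which is what makes the joint radius-margin objective meaningful.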

Similar resources

A QUADRATIC MARGIN-BASED MODEL FOR WEIGHTING FUZZY CLASSIFICATION RULES INSPIRED BY SUPPORT VECTOR MACHINES

Recently, tuning the weights of the rules in Fuzzy Rule-Based Classification Systems has been studied in order to improve classification accuracy. In this paper, a margin-based optimization model, inspired by Support Vector Machine classifiers, is proposed to compute these fuzzy rule weights. This approach not only  considers both accuracy and generalization criteria in a single objective fu...
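Although the snippet is cut off, the generic construction it alludes to can be sketched: let r_k(x) denote the firing strength of the k-th fuzzy rule on input x; stacking these into a vector r(x), the rule weights w can be learned with a soft-margin, SVM-style quadratic program (a hedged illustration of the general idea, not necessarily the paper's exact model):

\[
\min_{w,\,b,\,\xi}\ \tfrac12\lVert w\rVert^2 + C\sum_i \xi_i
\quad\text{s.t.}\quad y_i\big(w^\top r(x_i) + b\big) \ge 1 - \xi_i,\;\ \xi_i \ge 0 .
\]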


Tuning L1-SVM Hyperparameters with Modified Radius Margin Bounds and Simulated Annealing

In the design of support vector machines an important step is to select the optimal hyperparameters. One of the most widely used performance estimators is the radius-margin bound. Some modifications of this bound have been made to adapt it to soft-margin problems, giving a convex optimization problem for the L2 soft-margin formulation. However, it is still interesting to consider the L1 case du...
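The snippet does not show the modified bounds or the annealing schedule, so the following is only a minimal sketch of the generic recipe: evaluate the classical R²‖w‖² criterion for an RBF-kernel SVM and search the hyperparameters with scipy's dual_annealing (a simulated-annealing variant). The projected-ascent solver for the radius is a crude stand-in for a proper QP solver, and the whole sketch assumes the classical bound rather than the paper's modified L1 version.

    import numpy as np
    from scipy.optimize import dual_annealing
    from sklearn.svm import SVC
    from sklearn.metrics.pairwise import rbf_kernel

    def radius_squared(K, iters=500, step=0.01):
        # Smallest enclosing sphere in feature space, dual QP:
        #   max_beta  beta . diag(K) - beta^T K beta,  beta on the simplex.
        # Crude projected-ascent approximation; use a QP solver in practice.
        n = K.shape[0]
        beta = np.full(n, 1.0 / n)
        d = np.diag(K)
        for _ in range(iters):
            beta += step * (d - 2.0 * K @ beta)   # gradient ascent step
            beta = np.clip(beta, 0.0, None)
            beta /= beta.sum()                    # stay on the simplex
        return float(d @ beta - beta @ K @ beta)

    def radius_margin_bound(params, X, y):
        # params are (log10 C, log10 gamma); returns R^2 * ||w||^2.
        C, gamma = 10.0 ** params[0], 10.0 ** params[1]
        K = rbf_kernel(X, gamma=gamma)
        svm = SVC(C=C, kernel="precomputed").fit(K, y)
        a = svm.dual_coef_.ravel()                # signed duals: alpha_i * y_i
        Ksv = K[np.ix_(svm.support_, svm.support_)]
        return radius_squared(K) * float(a @ Ksv @ a)   # ||w||^2 = a^T K a

    # Anneal over (log10 C, log10 gamma); X, y are the training data:
    # result = dual_annealing(radius_margin_bound, [(-2, 3), (-5, 1)], args=(X, y))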


Escaping the Convex Hull with Extrapolated Vector Machines

Maximum margin classifiers such as Support Vector Machines (SVMs) critically depend upon the convex hulls of the training samples of each class, as they implicitly search for the minimum distance between the convex hulls. We propose Extrapolated Vector Machines (XVMs), which rely on extrapolations outside these convex hulls. XVMs improve SVM generalization very significantly on the MNIST [7] O...
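The convex-hull view invoked here is the classical geometric dual of the hard-margin SVM (a known equivalence, stated for context): the maximum-margin hyperplane perpendicularly bisects the segment joining the closest points of the two class hulls,

\[
\min_{u,\,v}\ \lVert u - v\rVert
\quad\text{s.t.}\quad u \in \operatorname{conv}\{x_i : y_i = +1\},\;\ v \in \operatorname{conv}\{x_i : y_i = -1\},
\]

with w proportional to u − v and margin γ = ‖u − v‖/2. Extrapolating outside the hulls, as XVMs do, amounts to changing the feasible sets of this nearest-point problem.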


Feature Weighting Using Margin and Radius Based Error Bound Optimization in SVMs

The Support Vector Machine error bound is a function of the margin and the radius. Standard SVM algorithms maximize the margin within a given feature space; the radius is therefore fixed and is ignored in the optimization. We propose an extension of the standard SVM optimization in which we also account for the radius in order to produce an even tighter error bound than what we get by controlling...
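A generic form of the joint objective this line of work targets (a hedged sketch; the paper's exact formulation may differ) scales feature k by a learned weight μ_k ≥ 0 and minimizes the radius-margin product in the scaled space:

\[
\min_{\mu \ge 0,\ w,\ b,\ \xi \ge 0}\ R_\mu^2\,\lVert w\rVert^2 + C\sum_i \xi_i
\quad\text{s.t.}\quad y_i\big(w^\top \operatorname{diag}(\sqrt{\mu})\,x_i + b\big) \ge 1 - \xi_i ,
\]

where R_μ is the radius of the smallest sphere enclosing the scaled points diag(√μ)x_i. Because R_μ depends on μ, margin and data spread are controlled jointly; this joint problem is non-convex in general, which is precisely what the convex reformulations in the main paper address.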




Publication date: 2013